Properties of Persistent Mutual Information and Emergence

Author

  • Peter Gmeiner
Abstract

The persistent mutual information (PMI) is a complexity measure for stochastic processes. It is related to well-known complexity measures such as the excess entropy and the statistical complexity. Essentially, it is a variation of the excess entropy that can be interpreted as a specific measure of a system's internal memory. The PMI was first introduced in 2010 by Ball, Diakonova and MacKay as a measure of (strong) emergence [Bal10]. In this paper we define the PMI mathematically and investigate its relation to the excess entropy and the statistical complexity. In particular, we prove that the excess entropy is an upper bound on the PMI. Furthermore, we establish several properties of the PMI and calculate it explicitly for some example processes. We also discuss to what extent it is a measure of emergence and compare it with alternative approaches used to formalize emergence.
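The PMI described above is the mutual information between a process's past and its future after a time gap. As a rough illustration only (not the paper's construction, which takes infinite past and future blocks), the following is a minimal plug-in estimator on finite blocks of a simulated binary Markov chain; the function names and the parameters `k` (block length), `tau` (gap), and `p` (persistence probability) are hypothetical choices for this sketch:

```python
import random
from collections import Counter
from math import log2

def entropy(counts):
    # Plug-in Shannon entropy (bits) from a Counter of observed symbols.
    n = sum(counts.values())
    return -sum(c / n * log2(c / n) for c in counts.values())

def mutual_info(pairs):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from observed (x, y) pairs.
    joint = Counter(pairs)
    px = Counter(a for a, _ in pairs)
    py = Counter(b for _, b in pairs)
    return entropy(px) + entropy(py) - entropy(joint)

def markov_chain(n, p=0.9, seed=0):
    # Binary chain that repeats its last symbol with probability p.
    rng = random.Random(seed)
    x = [rng.randint(0, 1)]
    for _ in range(n - 1):
        x.append(x[-1] if rng.random() < p else 1 - x[-1])
    return x

def pmi_estimate(x, k=3, tau=5):
    # MI between a past block of length k and a future block of length k
    # that starts tau steps later -- the "omitted present" interval.
    pairs = [(tuple(x[i - k:i]), tuple(x[i + tau:i + tau + k]))
             for i in range(k, len(x) - tau - k)]
    return mutual_info(pairs)

x = markov_chain(200_000)
print(pmi_estimate(x, k=3, tau=0))   # no gap: excess-entropy-like block estimate
print(pmi_estimate(x, k=3, tau=20))  # with a gap: persistent part only
```

For this mixing chain the gapped estimate decays toward zero as `tau` grows, consistent with the intuition that the limiting PMI vanishes for processes without persistent internal memory.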


Similar articles

Quantifying Emergence in Terms of Persistent Mutual Information

We define Persistent Mutual Information (PMI) as the Mutual (Shannon) Information between the past history of a system and its evolution significantly later in the future. This quantifies how much past observations enable long-term prediction, which we propose as the primary signature of (Strong) Emergent Behavior. The key feature of our definition of PMI is the omission of an interval of “pres...

Full text

Research of Blind Signals Separation with Genetic Algorithm and Particle Swarm Optimization Based on Mutual Information

Blind source separation separates mixed signals blindly, without any information about the mixing system. In this paper, we have used two evolutionary algorithms, namely genetic algorithm and particle swarm optimization, for blind source separation. In these techniques, a novel fitness function based on mutual information and higher-order statistics is proposed. In order to evalu...

Full text

On Classification of Bivariate Distributions Based on Mutual Information

Among all measures of independence between random variables, mutual information is the only one based on information theory. Mutual information takes into account all kinds of dependence between variables, i.e., both linear and non-linear dependencies. In this paper we have classified some well-known bivariate distributions into two classes of distributions based on their mutua...

Full text

A Novel Subsampling Method for 3D Multimodality Medical Image Registration Based on Mutual Information

Mutual information (MI) is a widely used similarity metric for multimodality image registration. However, it involves extremely high computational cost, especially when applied to volume images. Moreover, its robustness is affected by the existence of local maxima. Multi-resolution pyramid approaches have been proposed to speed up the registration process and increase the accuracy of th...

Full text

Performance Evaluation of Closed Ended Mutual Funds in Pakistan

Mutual funds are among the best tools for mobilizing savings and investment in an economy, and Pakistan is the pioneer in South Asia, yet the industry is not as mature as its age would suggest. This paper examines the performance of closed-ended mutual funds in Pakistan using five different ranking measures over the period January 2009 to December 2013, and the sample consists of o...

Full text


Journal:
  • CoRR

Volume: abs/1210.5058

Pages: -

Publication date: 2012